On the Performance of Sparse Recovery via ℓp-Minimization (0 ≤ p ≤ 1)

Authors

  • Meng Wang
  • Weiyu Xu
  • Ao Tang
Abstract

It is known that a high-dimensional sparse vector x* in R^n can be recovered from low-dimensional measurements y = Ax*, where A ∈ R^{m×n} (m < n) is the measurement matrix. In this paper, with A being a random Gaussian matrix, we investigate the recovery ability of ℓp-minimization (0 ≤ p ≤ 1) as p varies, where ℓp-minimization returns a vector with the least ℓp quasi-norm among all vectors x satisfying Ax = y. Besides analyzing the performance of strong recovery, where ℓp-minimization is required to recover all sparse vectors up to a certain sparsity, we also for the first time analyze the performance of "weak" recovery of ℓp-minimization (0 ≤ p < 1), where the aim is to recover all sparse vectors on one support with a fixed sign pattern. When α := m/n → 1, we provide sharp thresholds of the sparsity ratio (i.e., the percentage of nonzero entries of a vector) that differentiate success from failure of ℓp-minimization. For strong recovery, the threshold strictly decreases from 0.5 to 0.239 as p increases from 0 to 1. Surprisingly, for weak recovery, the threshold is 2/3 for all p in [0, 1), while the threshold is 1 for ℓ1-minimization. We also explicitly demonstrate that ℓp-minimization (p < 1) can return a denser solution than ℓ1-minimization. For any α ∈ (0, 1), we provide bounds of the sparsity ratio for strong recovery and weak recovery, respectively, below which ℓp-minimization succeeds. Our bound for strong recovery improves on existing bounds when α is large. In particular, regarding the recovery threshold, this paper argues that ℓp-minimization has a higher threshold with smaller p for strong recovery; the threshold is the same for all p for sectional recovery; and ℓ1-minimization can outperform ℓp-minimization for weak recovery. This is in contrast to the traditional wisdom that ℓp-minimization, though computationally more expensive, always has better sparse recovery ability than ℓ1-minimization since it is closer to ℓ0-minimization. Finally, we provide an intuitive explanation of our findings. Numerical examples are also used to unambiguously confirm and illustrate the theoretical predictions.
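To make the recovery setting concrete, the sketch below is a minimal illustration, not the paper's method or analysis: the problem sizes n, m, k, the epsilon schedule, and the IRLS surrogate used for p < 1 are all illustrative assumptions. It draws a random Gaussian A and a sparse x*, solves ℓ1-minimization (basis pursuit) as a linear program, and applies a common iteratively reweighted least-squares heuristic as a stand-in for ℓp-minimization with p < 1, using NumPy/SciPy.

```python
# Minimal sketch (not the paper's algorithm): recover a sparse x* from
# Gaussian measurements y = A x* via l1-minimization (basis pursuit, as an LP)
# and via an IRLS surrogate for lp-minimization with p < 1.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 100, 50, 10                  # illustrative sizes: ambient dim, measurements, sparsity
A = rng.standard_normal((m, n))        # random Gaussian measurement matrix
x_true = np.zeros(n)
support = rng.choice(n, k, replace=False)
x_true[support] = rng.standard_normal(k)
y = A @ x_true

# --- l1-minimization: min ||x||_1 s.t. Ax = y, written as an LP in (x, u) ---
c = np.concatenate([np.zeros(n), np.ones(n)])      # minimize sum(u)
A_eq = np.hstack([A, np.zeros((m, n))])            # A x = y
A_ub = np.block([[np.eye(n), -np.eye(n)],          #  x - u <= 0
                 [-np.eye(n), -np.eye(n)]])        # -x - u <= 0
b_ub = np.zeros(2 * n)
bounds = [(None, None)] * n + [(0, None)] * n      # x free, u >= 0
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y, bounds=bounds)
x_l1 = res.x[:n]

# --- lp-minimization (0 < p < 1) via an IRLS heuristic (one of many solvers) ---
def irls_lp(A, y, p=0.5, iters=50, eps=1e-3):
    """Heuristic surrogate for min ||x||_p^p s.t. Ax = y (local solution only)."""
    x = A.T @ np.linalg.solve(A @ A.T, y)          # least-norm starting point
    for _ in range(iters):
        d = (x**2 + eps) ** (1.0 - p / 2.0)        # inverse of IRLS weights
        AD = A * d                                 # A @ diag(d)
        x = d * (A.T @ np.linalg.solve(AD @ A.T, y))
        eps = max(eps / 10.0, 1e-10)               # gradually sharpen the smoothing
    return x

x_lp = irls_lp(A, y, p=0.5)

print("l1 recovery error :", np.linalg.norm(x_l1 - x_true))
print("lp (IRLS) error   :", np.linalg.norm(x_lp - x_true))
```

Note that for p < 1 the problem is non-convex, so the IRLS heuristic above only finds a local solution; it is included solely to show what an ℓp-minimization instance looks like, not to reproduce the thresholds reported in the abstract.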

Similar articles

A New IRIS Segmentation Method Based on Sparse Representation

Iris recognition is one of the most reliable methods for identification. In general, it consists of image acquisition, iris segmentation, feature extraction, and matching. Among them, iris segmentation plays an important role in the performance of any iris recognition system. Nonlinear eye movement, occlusion, and specular reflection are the main challenges for any iris segmentation method. In thi...


Image Classification via Sparse Representation and Subspace Alignment

Image representation is a crucial problem in image processing, where there exist many low-level representations of an image, e.g., SIFT, HOG, and so on. But there is a missing link across low-level and high-level semantic representations. In fact, traditional machine learning approaches, e.g., non-negative matrix factorization, sparse representation, and principal component analysis, are employed to d...


Deblocking Joint Photographic Experts Group Compressed Images via Self-learning Sparse Representation

JPEG is one of the most widely used image compression methods, but it causes annoying blocking artifacts at low bit rates. Sparse representation is an efficient technique that can solve many inverse problems in image processing applications such as denoising and deblocking. In this paper, a post-processing method is proposed for reducing JPEG blocking effects via sparse representation. In this ...


A Sharp Sufficient Condition for Sparsity Pattern Recovery

The sufficient number of noisy linear measurements for exact and approximate sparsity pattern/support set recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, considerable gaps remain between those results and the exact limits of perfect support set recovery. To reduce this gap, in this paper, the sufficient con...


Face Recognition in Thermal Images based on Sparse Classifier

Despite recent advances, face recognition systems suffer from serious problems because of the many types of variation in the human face (such as lighting, glasses, head tilt, and different emotional states). Each of these factors can significantly reduce face recognition accuracy. Several methods have been proposed to overcome these problems. Nonetheless, in recent ye...



Publication date: 2011